Best approximation by Heaviside perceptron networks

Authors

  • Paul C. Kainen
  • Vera Kurková
  • Andrew Vogt
Abstract

In Lp-spaces with p ∈ [1, ∞) there exists a best approximation mapping to the set of functions computable by Heaviside perceptron networks with n hidden units; however, for p ∈ (1, ∞) such a best approximation is not unique and cannot be continuous.
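As a purely illustrative sketch (not from the paper), the one-hidden-unit case can be explored numerically: below, the target f(x) = x on [0, 1] is approximated in L2 by a single Heaviside unit g(x) = c·H(x − t) via a brute-force grid search over the weight c and threshold t. The target function, domain, and grid resolution are all assumptions made for the example.

```python
import numpy as np

# Illustrative sketch: best L2 approximation of f(x) = x on [0, 1]
# by a one-hidden-unit Heaviside network g(x) = c * H(x - t).
xs = np.linspace(0.0, 1.0, 10001)
f = xs

def l2_error(c, t):
    # H(x - t) realized as a boolean indicator; domain has length 1,
    # so the mean of the squared residual approximates the L2 integral.
    g = c * (xs >= t)
    return np.sqrt(np.mean((f - g) ** 2))

# Brute-force search over a (c, t) grid for the smallest L2 error.
err, c, t = min(
    (l2_error(c, t), c, t)
    for t in np.linspace(0.0, 0.99, 100)
    for c in np.linspace(0.0, 1.0, 101)
)
print(f"threshold t ~ {t:.2f}, weight c ~ {c:.2f}, L2 error ~ {err:.3f}")
```

For this particular target a short calculus exercise gives the exact minimizer t = 1/3, c = 2/3 with squared error 1/27, which the grid search reproduces; the point of the paper is what happens to such minimizers (existence, uniqueness, continuity in the target) as the target and p vary, not how to compute them.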


Similar resources

Replacing points by compacta in neural network approximation

It is shown that cartesian product and pointwise-sum with a fixed compact set preserve various approximation-theoretic properties. Results for pointwise-sum are proved for F -spaces and so hold for any normed linear space, while the other results hold in general metric spaces. Applications are given to approximation of Lp-functions on the d-dimensional cube, 1 ≤ p < ∞, by linear combinations of...


Approximation of functions by perceptron networks with bounded number of hidden units

We examine the effect of constraining the number of hidden units. For one-hidden-layer networks with a fairly general type of units, including perceptrons with any bounded activation function and radial-basis-function units, we show that when the size of the parameters is also bounded, the best approximation property is satisfied, which means that there always exists a parameterization achieving the global ...


Estimates of the Number of Hidden Units and Variation with Respect to Half-Spaces

We estimate variation with respect to half-spaces in terms of "flows through hyperplanes". Our estimate is derived from an integral representation for smooth compactly supported multivariable functions, proved using properties of the Heaviside and delta distributions. Consequently, we obtain conditions which guarantee an approximation error rate of order O by one-hidden-layer networks with n sigmoid...


Best one-sided L1 approximation to the Heaviside and sign functions

We find the polynomials of best one-sided approximation to the Heaviside and sign functions. The polynomials are obtained by Hermite interpolation at the zeros of certain Jacobi polynomials. We also give an estimate of the error of approximation and characterize the extremal points of the convex set of best approximants.


Optimal Pareto Parametric Analysis of Two Dimensional Steady-State Heat Conduction Problems by MLPG Method

Numerical solutions obtained by the Meshless Local Petrov-Galerkin (MLPG) method are presented for two-dimensional steady-state heat conduction problems. The MLPG method is a truly meshless approach, and neither nodal connectivity nor a background mesh is required for solving the initial-boundary-value problem. The penalty method is adopted to efficiently enforce the essential boundary co...



Journal:
  • Neural networks : the official journal of the International Neural Network Society

Volume 13, Issue 7

Pages: -

Published: 2000